JEEHP : Journal of Educational Evaluation for Health Professions

Author index

Sean Tackett (2 articles)
Feasibility of clinical performance assessment of medical students on a virtual sub-internship in the United States  
John Woller, Sean Tackett, Ariella Apfel, Janet Record, Danelle Cayea, Shannon Walker, Amit Pahwa
J Educ Eval Health Prof. 2021;18:12.   Published online June 22, 2021
DOI: https://doi.org/10.3352/jeehp.2021.18.12
  • 4,727 View
  • 292 Download
  • 1 Web of Science
  • 1 Crossref
Abstract
We aimed to determine whether it was feasible to assess medical students as they completed a virtual sub-internship. Six students (out of 31 who completed an in-person sub-internship) participated in a 2-week virtual sub-internship, caring for patients remotely. Residents and attendings assessed those 6 students in 15 domains using the same assessment measures from the in-person sub-internship. Raters marked “unable to assess” in 75/390 responses (19%) for the virtual sub-internship versus 88/3,405 (2.6%) for the in-person sub-internship (P=0.01), most frequently for the virtual sub-internship in the domains of the physical examination (21, 81%), rapport with patients (18, 69%), and compassion (11, 42%). Students received complete assessments in most areas. Scores were higher for the in-person than the virtual sub-internship (4.67 vs. 4.45, P<0.01) for students who completed both. Students uniformly rated the virtual clerkship positively. Students can be assessed in many domains in the context of a virtual sub-internship.
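The "unable to assess" rates above come from simple proportions over the response counts reported in the abstract. As a minimal arithmetic check (the authors' actual statistical comparison method is not stated in this abstract, so only the percentages are reproduced here):

```python
# Recompute the "unable to assess" rates reported in the abstract.
# Counts are taken directly from the text.

def rate(marked: int, total: int) -> float:
    """Proportion of assessment responses marked 'unable to assess'."""
    return marked / total

virtual = rate(75, 390)      # virtual sub-internship
in_person = rate(88, 3405)   # in-person sub-internship

print(f"virtual:   {virtual:.1%}")    # 19.2%
print(f"in-person: {in_person:.1%}")  # 2.6%
```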

Citations to this article as recorded by Crossref:
  • Association of Virtual Away Rotations With Residency Applicant Outcomes in Otolaryngology
    Nicholas R. Lenze, William J. Benjamin, Lauren A. Bohm, Marc C. Thorne, Michael J. Brenner, Angela P. Mihalic, Robbi A. Kupfer
    OTO Open.2023;[Epub]     CrossRef
Profiling medical school learning environments in Malaysia: a validation study of the Johns Hopkins Learning Environment Scale  
Sean Tackett, Hamidah Abu Bakar, Nicole A. Shilkofski, Niamh Coady, Krishna Rampal, Scott Wright
J Educ Eval Health Prof. 2015;12:39.   Published online July 9, 2015
DOI: https://doi.org/10.3352/jeehp.2015.12.39
  • 30,189 View
  • 171 Download
  • 10 Web of Science
  • 5 Crossref
Abstract
Purpose
While a strong learning environment is critical to medical student education, the assessment of medical school learning environments has confounded researchers. Our goal was to assess the validity and utility of the Johns Hopkins Learning Environment Scale (JHLES) for preclinical students at three Malaysian medical schools with distinct educational and institutional models. Two schools were new international partnerships, and the third was a school-leaver program established without an international partnership.
Methods
First- and second-year students responded anonymously to surveys at the end of the academic year. The surveys included the JHLES, a 28-item instrument using five-point Likert scale response options; the Dundee Ready Educational Environment Measure (DREEM), the most widely used instrument for assessing learning environments internationally; a personal growth scale; and single-item global learning environment assessment variables.
Results
The overall response rate was 369/429 (86%). After adjusting for the medical school year, gender, and ethnicity of the respondents, the JHLES detected differences across institutions in four out of seven domains (57%), with each school having a unique domain profile. The DREEM detected differences in one out of five categories (20%). The JHLES was more strongly correlated than the DREEM with two-thirds of the single-item variables and the personal growth scale. The JHLES showed high internal reliability for the total score (α=0.92) and the seven domains (α=0.56-0.85).
Conclusion
The JHLES detected variation between learning environment domains across three educational settings, thereby creating unique learning environment profiles. Interpretation of these profiles may allow schools to understand how they are currently supporting trainees and identify areas needing attention.
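The internal reliability figures above are Cronbach's alpha values. As a minimal sketch of how that statistic is computed (the respondent data below are invented for illustration, not taken from the study), alpha relates the sum of per-item variances to the variance of the total scores:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of totals),
# where k is the number of items on the scale.
# Toy data: rows are respondents, columns are Likert items (illustrative only).

def variance(xs):
    """Population variance of a sequence of scores."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(rows):
    """rows: list of per-respondent item-score lists, all the same length."""
    k = len(rows[0])                         # number of items
    items = list(zip(*rows))                 # transpose to per-item columns
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(r) for r in rows])
    return k / (k - 1) * (1 - item_var / total_var)

scores = [
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
]
print(round(cronbach_alpha(scores), 2))  # → 0.95
```

Values near 1 (such as the 0.92 reported for the JHLES total score) indicate that the items vary together, i.e., high internal consistency.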

Citations to this article as recorded by Crossref:
  • Validation of the Polish version of the DREEM questionnaire – a confirmatory factor analysis
    Dorota Wójcik, Leszek Szalewski, Adam Bęben, Iwona Ordyniec-Kwaśnica, Sue Roff
    BMC Medical Education.2023;[Epub]     CrossRef
  • Association between patient care ownership and personal or environmental factors among medical trainees: a multicenter cross-sectional study
    Hirohisa Fujikawa, Daisuke Son, Takuya Aoki, Masato Eto
    BMC Medical Education.2022;[Epub]     CrossRef
  • Measuring Students’ Perceptions of the Medical School Learning Environment: Translation, Transcultural Adaptation, and Validation of 2 Instruments to the Brazilian Portuguese Language
    Rodolfo F Damiano, Aline O Furtado, Betina N da Silva, Oscarina da S Ezequiel, Alessandra LG Lucchetti, Lisabeth F DiLalla, Sean Tackett, Robert B Shochet, Giancarlo Lucchetti
    Journal of Medical Education and Curricular Development.2020; 7: 238212052090218.     CrossRef
  • Developing an Introductory Radiology Clerkship at Perdana University Graduate School of Medicine in Kuala Lumpur, Malaysia
    Sarah Wallace Cater, Lakshmi Krishnan, Lars Grimm, Brian Garibaldi, Isabel Green
    Health Professions Education.2017; 3(2): 113.     CrossRef
  • Trainers' perception of the learning environment and student competency: A qualitative investigation of midwifery and anesthesia training programs in Ethiopia
    Sharon Kibwana, Rachel Haws, Adrienne Kols, Firew Ayalew, Young-Mi Kim, Jos van Roosmalen, Jelle Stekelenburg
    Nurse Education Today.2017; 55: 5.     CrossRef